Kernel Methods for Nonlinear Discriminative Data Analysis

Authors

  • Xiuwen Liu
  • Washington Mio
Abstract

Optimal Component Analysis (OCA) is a linear subspace technique for dimensionality reduction designed to optimize object classification and recognition performance. The linear nature of OCA often limits recognition performance when the underlying data structure is nonlinear or the cluster structure is complex. To address these problems, we investigate a kernel analogue of OCA, which consists of applying OCA techniques to the data after it has been mapped nonlinearly into a new feature space, typically a high-dimensional (possibly infinite-dimensional) Hilbert space. In this paper, we study both the theoretical and algorithmic aspects of the problem and report results obtained in several object recognition experiments.
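The paper's specific OCA optimization is not reproduced here, but the underlying kernel trick it builds on can be illustrated with kernel PCA, a closely related technique: the data is implicitly mapped into a high-dimensional feature space via a kernel function, and a linear subspace method is then applied there by working only with the Gram matrix. The sketch below (RBF kernel, `gamma`, and the synthetic data are illustrative assumptions, not from the paper):

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2),
    # i.e. inner products in the implicit feature space.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project X onto the top principal components in kernel feature space."""
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    # Center the (implicit) feature-space data via the Gram matrix.
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Coordinates of the training points along each component.
    return vecs * np.sqrt(np.maximum(vals, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))                 # toy data, 50 points in R^5
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)                               # (50, 2)
```

Kernel OCA follows the same pattern, but replaces the variance-maximizing criterion of PCA with OCA's recognition-performance objective, optimized over subspaces of the feature space.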


Similar articles

Learning Discriminative Metrics via Generative Models and Kernel Learning

Metrics specifying distances between data points can be learned in a discriminative manner or from generative models. In this paper, we show how to unify generative and discriminative learning of metrics via a kernel learning framework. Specifically, we learn local metrics optimized from parametric generative models. These are then used as base kernels to construct a global kerne...


Discriminative Dimensionality Reduction in Kernel Space

Modern nonlinear dimensionality reduction (DR) techniques enable an efficient visual data inspection in the form of scatter plots, but they suffer from the fact that DR is inherently ill-posed. Discriminative dimensionality reduction (DiDi) offers one remedy, since it allows a practitioner to identify what is relevant and what should be regarded as noise by means of auxiliary information such a...


Local Image Descriptors Using Supervised Kernel ICA

PCA-SIFT is an extension to SIFT which aims to reduce SIFT's high dimensionality (128 dimensions) by applying PCA to the gradient image patches. However, PCA is not a discriminative representation for recognition, owing to its global features and unsupervised nature. In addition, linear methods such as PCA and ICA can fail in the case of nonlinearity. In this paper, we propose a new discr...


Face Recognition Based on Optimal Kernel Minimax Probability Machine

Face recognition has received extensive attention due to its potential applications in many fields. To effectively deal with this problem, a novel face recognition algorithm is proposed by using the optimal kernel minimax probability machine. The key idea of the algorithm is as follows: First, the discriminative facial features are extracted with local fisher discriminant analysis (LFDA). Then,...


Kernelized Support Tensor Machines

In the context of supervised tensor learning, preserving the structural information and exploiting the discriminative nonlinear relationships of tensor data are crucial for improving the performance of learning tasks. Based on tensor factorization theory and kernel methods, we propose a novel Kernelized Support Tensor Machine (KSTM) which integrates kernelized tensor factorization with maximum-...




Published: 2005